Discontinuous Piecewise Polynomial Neural Networks
Author
Abstract
An artificial neural network is presented based on the idea of connections between units that are active only over a specific range of input values and zero outside that range (and so need not be evaluated outside the active range). Each connection function is represented by a polynomial with compact support. The finite range of activation allows for great activation sparsity in the network, and means that, in principle, computational capacity can be added to the network without increasing the time required to evaluate it for a given input. Polynomial orders from first through fifth are considered. Unit dropout is used for regularization, and a parameter-free weight update is used. Moving from piecewise linear to piecewise quadratic connections improves performance, and higher-order polynomials improve it further. The algorithm is tested on the MAGIC gamma-ray data set as well as the MNIST data set.
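The paper's exact parameterization is not reproduced in this abstract; a minimal sketch of the compact-support idea, with hypothetical breakpoints and per-piece polynomial coefficients, might look like:

```python
import numpy as np

def piecewise_poly_connection(x, breakpoints, coeffs):
    """Evaluate a connection function with compact support.

    `breakpoints` splits the supported input range into pieces; each piece
    carries its own polynomial coefficients (highest order first, as in
    np.polyval). Inputs outside the supported range contribute zero, so
    those pieces are never evaluated -- the source of the sparsity claim.
    """
    y = np.zeros_like(x, dtype=float)
    for i in range(len(breakpoints) - 1):
        lo, hi = breakpoints[i], breakpoints[i + 1]
        mask = (x >= lo) & (x < hi)  # only this piece is "active"
        if mask.any():
            # map the active inputs to a local coordinate in [-1, 1]
            t = 2.0 * (x[mask] - lo) / (hi - lo) - 1.0
            y[mask] = np.polyval(coeffs[i], t)
    return y
```

For example, with breakpoints `[-1, 0, 1]` and constant pieces, any input outside `[-1, 1)` evaluates to zero without touching either polynomial. The names and the `[-1, 1]` local coordinate are assumptions for illustration, not the paper's definitions.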
Similar Resources
Almost Linear VC Dimension Bounds for Piecewise Polynomial Networks
We compute upper and lower bounds on the VC dimension and pseudo-dimension of feedforward neural networks composed of piecewise polynomial activation functions. We show that if the number of layers is fixed, then the VC dimension and pseudo-dimension grow as W log W, where W is the number of parameters in the network. This result stands in opposition to the case where the number of layers is unbo...
Enlarging Domain of Attraction for a Special Class of Continuous-time Piecewise Affine Systems Based on Discontinuous Piecewise Quadratic Lyapunov Functions
This paper presents a new approach to estimate and to enlarge the domain of attraction for a planar continuous-time piecewise affine system. Various continuous Lyapunov functions have been proposed to estimate and to enlarge the system’s domain of attraction. In the proposed method with a new vision and with the aids of a discontinuous piecewise quadratic Lyapunov function, the domain of attrac...
Discontinuities in Recurrent Neural Networks
This article studies the computational power of various discontinuous real computational models that are based on the classical analog recurrent neural network (ARNN). This ARNN consists of a finite number of neurons; each neuron computes a polynomial net function and a sigmoid-like continuous activation function. We introduce arithmetic networks as ARNN augmented with a few simple discontinuous ...
Numerical Solution of a Fredholm Integro-differential Equation Modelling Neural Networks
We compare piecewise linear and polynomial collocation approaches for the numerical solution of a Fredholm integro-differential equation modelling neural networks. Both approaches combine the use of Gaussian quadrature rules on an infinite interval of integration with interpolation to a uniformly distributed grid on a bounded interval. These methods are illustrated by numerical experiments on ...
LIMIT CYCLES FOR m–PIECEWISE DISCONTINUOUS POLYNOMIAL LIÉNARD DIFFERENTIAL EQUATIONS
We provide lower bounds for the maximum number of limit cycles for the m–piecewise discontinuous polynomial differential equations ẋ = y + sgn(gm(x, y))F(x), ẏ = −x, where the zero set of the function sgn(gm(x, y)) with m = 2, 4, 6, . . . is the product of m/2 straight lines passing through the origin of coordinates dividing the plane in sectors of angle 2π/m, and sgn(z) denotes the sign funct...
Journal: CoRR
Volume: abs/1505.04211
Pages: -
Publication date: 2015